Coupled Singular Value Decomposition of a Cross-Covariance Matrix

Authors

  • Alexander Kaiser
  • Wolfram Schenck
  • Ralf Möller
Abstract

We derive coupled on-line learning rules for the singular value decomposition (SVD) of a cross-covariance matrix. In coupled SVD rules, the singular value is estimated alongside the singular vectors, and the effective learning rates for the singular vector rules are influenced by the singular value estimates. In addition, we use a first-order approximation of Gram-Schmidt orthonormalization as a decorrelation method for the estimation of multiple singular vectors and singular values. Experiments on synthetic data show that coupled learning rules converge faster than Hebbian learning rules and that the first-order approximation of Gram-Schmidt orthonormalization produces more precise estimates and better orthonormality than the standard deflation method.
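
To make the coupling idea concrete, the following is a minimal Python/NumPy sketch of a coupled-style online estimate of the leading singular triplet (u, sigma, v) of a cross-covariance matrix A = E[x y^T] from streaming sample pairs (x, y). It uses a cross-coupled Hebbian-type vector update in which the running singular value estimate sigma scales the effective learning rate of the vector updates; the precise update equations, the synthetic sampling model, the initialization, and the learning-rate schedule are illustrative assumptions and are not taken from the paper.

import numpy as np

# Coupled-style online estimation of the leading singular triplet of a
# cross-covariance matrix A = E[x y^T] from streaming sample pairs (x, y).
# All modelling choices below are illustrative assumptions, not the paper's.

rng = np.random.default_rng(0)

n, m = 8, 5
A_true = rng.standard_normal((n, m))                    # ground-truth cross-covariance matrix
U, S, Vt = np.linalg.svd(A_true, full_matrices=False)   # batch reference solution for checking

def sample_pair():
    # Synthetic generative model with E[x y^T] = A_true (assumed for the demo).
    y = rng.standard_normal(m)
    x = A_true @ y + 0.01 * rng.standard_normal(n)
    return x, y

u = rng.standard_normal(n); u /= np.linalg.norm(u)   # left singular vector estimate
v = rng.standard_normal(m); v /= np.linalg.norm(v)   # right singular vector estimate
sigma = 1.0                                          # singular value estimate

for t in range(200_000):
    x, y = sample_pair()
    a, b = u @ x, v @ y                  # projections of the samples onto the estimates
    gamma = 1.0 / (1_000 + t)            # decaying learning rate
    s = max(sigma, 0.1)                  # guard against a tiny sigma early on
    u += (gamma / s) * (b * x - a * b * u)   # vector update, effective rate scaled by 1/sigma
    v += (gamma / s) * (a * y - a * b * v)   # vector update, effective rate scaled by 1/sigma
    sigma += gamma * (a * b - sigma)         # running estimate of the singular value

print("estimated sigma:", sigma, " true sigma_1:", S[0])
print("|<u, u_1>| =", abs(u @ U[:, 0]), " |<v, v_1>| =", abs(v @ Vt[0]))

Dividing the vector updates by the current singular value estimate is one simple way to let the singular value influence the effective learning rate, which is presumably what makes coupled rules less sensitive to the magnitude of the singular value than plain Hebbian rules; the final two lines compare the online estimates against a batch SVD of the true matrix.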

Similar articles

Symbolic computation of the Duggal transform

Following the results of [Med] on the Aluthge transform of polynomial matrices, this paper develops the symbolic computation of the Duggal transform of a polynomial matrix $A$, using the polar decomposition and the singular value decomposition of $A$. To that end, the polynomial singular value decomposition method is utilized, which is an iterative algorithm with numerical charac...

Graph Clustering by Hierarchical Singular Value Decomposition with Selectable Range for Number of Clusters Members

Graphs have many applications in real-world problems. When we deal with huge volumes of data, analyzing the data is difficult or sometimes impossible. In big-data problems, clustering is a useful tool for data analysis. Singular value decomposition (SVD) is one of the best algorithms for clustering graphs, but it offers no way to select the number of clusters and the number of members ...

Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper, a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...

Analysis of Eigenspace Dynamics with Applications to Array Processing

For an N-element array (Fig. 1(a)), methods such as beamforming and singular value decomposition rely on estimation of the sample covariance matrix, computed from M independent data snapshots. As M → ∞, the sample covariance is a consistent estimator of the true population covariance. However, this ideal condition cannot be met in most practical situations, in which large-aperture arrays operat...

Journal:
  • International journal of neural systems

Volume 20, Issue 4

Pages: -

Publication year: 2010